Markov process
noun
: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous
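As an illustrative sketch (not part of the dictionary entry), Brownian motion can be simulated as a continuous-state Markov process: each new state is the current state plus an independent Gaussian increment, so the future depends only on the present state, not on the earlier path. The function name and parameters below are hypothetical.

```python
import random

def brownian_path(n_steps, dt=0.01, seed=0):
    """Simulate one path of Brownian motion, a Markov process whose
    states range over the continuum of real numbers."""
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(n_steps):
        # Markov property: the next state depends only on the current
        # state x, via an independent Gaussian increment of variance dt.
        x += rng.gauss(0.0, dt ** 0.5)
        path.append(x)
    return path

path = brownian_path(1000)
print(len(path))  # 1001 states, each a real number rather than a discrete label
```

Unlike a Markov chain, whose states form a finite or countable set, every value `x` here is drawn from a continuous range, matching the distinction the definition draws.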